Low Resolution Scalar Quantization for Gaussian and Laplacian Sources with Absolute and Squared Error Distortion Measures

Authors

  • Daniel Marco
  • David L. Neuhoff
Abstract

This report considers low resolution scalar quantization. Specifically, it considers entropy-constrained scalar quantization for memoryless Gaussian and Laplacian sources with both squared and absolute error distortion measures. The slope at rate zero of the operational rate-distortion function of scalar quantization is found for each of these sources and distortion measures. It is shown that in three of the four cases this slope equals the slope of the corresponding Shannon rate-distortion function, which implies that, asymptotically at low resolution, scalar quantization with entropy coding is an optimal coding technique for these three cases. For a Gaussian source with the absolute error distortion measure, however, the slope at rate zero of the operational rate-distortion function of scalar quantization is infinite and hence does not match the slope of the corresponding Shannon rate-distortion function. Consequently, scalar quantization is not an optimal coding technique for a Gaussian source with the absolute error distortion measure. These results are obtained through an analysis of uniform and binary scalar quantizers, which shows that at low resolution their operational rate-distortion functions, in all four cases, coincide with the corresponding operational rate-distortion functions of scalar quantization in general. Lastly, the slope at rate zero of the Shannon rate-distortion function (the function itself is not known) is found for a Laplacian source with the squared error distortion measure.

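For orientation (an added note, not part of the abstract): the Gaussian source with squared error is the one case here where the Shannon rate-distortion function is known in closed form, so the "slope at rate zero" being matched can be written explicitly. The short derivation below is a standard result, stated for a source of variance σ².

```latex
% Shannon rate-distortion function of a memoryless Gaussian source with variance
% \sigma^2 under squared error (standard result, shown only to make the phrase
% "slope at rate zero" concrete):
\[
  R(D) \;=\; \frac{1}{2}\log_2\frac{\sigma^2}{D}, \qquad 0 \le D \le \sigma^2 .
\]
% Rate zero corresponds to D = \sigma^2, where the slope is
\[
  \left.\frac{dR}{dD}\right|_{D=\sigma^2} \;=\; -\frac{1}{2\sigma^2 \ln 2}
  \quad\text{bits per unit distortion.}
\]
% The abstract's claim for this case is that entropy-constrained scalar
% quantization attains this same slope at rate zero.
```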

Similar articles

Generalized boundary adaptation rule for minimizing rth power law distortion in high resolution quantization

A new generalized unsupervised competitive learning rule is introduced for adaptive scalar quantization. The rule, called the generalized Boundary Adaptation Rule (BAR_r), minimizes r-th power law distortion D_r in the high resolution case. It is shown by simulations that a fast version of BAR_r outperforms generalized Lloyd I in minimizing D_1 (mean absolute error) and D_2 (mean squared error) distorti...

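The blurb does not give the update equations. Purely to make the objective D_r concrete, the following is a generic online stochastic-gradient codebook adaptation for r-th power distortion; it is not the paper's BAR_r rule (which adapts cell boundaries rather than levels), and the codebook size, learning rate, and initialization are assumptions.

```python
# Rough illustration only: online stochastic-gradient adaptation of scalar
# quantizer levels under r-th power distortion |x - q(x)|^r. This is a generic
# competitive-learning scheme, NOT the BAR_r boundary-adaptation rule.
import numpy as np

def adapt_levels(samples, n_levels=8, r=1.0, lr=0.05, seed=0):
    rng = np.random.default_rng(seed)
    levels = np.sort(rng.choice(samples, n_levels, replace=False))  # initial codebook
    for x in samples:
        i = np.argmin(np.abs(x - levels))                 # winning (nearest) level
        g = r * np.abs(x - levels[i]) ** (r - 1) * np.sign(x - levels[i])
        levels[i] += lr * g                               # descent step on |x - level|^r
    return np.sort(levels)

if __name__ == "__main__":
    data = np.random.default_rng(1).laplace(size=20000)
    lv = adapt_levels(data, n_levels=8, r=1.0)            # r = 1: mean absolute error (D_1)
    d1 = np.mean(np.abs(data - lv[np.argmin(np.abs(data[:, None] - lv), axis=1)]))
    print("levels:", np.round(lv, 3), " D_1:", round(d1, 4))
```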

Codecell Contiguity in Optimal Fixed-Rate and Entropy-Constrained Network Scalar Quantizers

We consider the properties of optimal fixed-rate and entropy-constrained scalar quantizers for finite alphabet sources. In particular, we consider conditions under which the optimal scalar quantizer with contiguous codecells achieves performance no worse than the optimal scalar quantizer without the constraint of codecell contiguity. In addition to traditional scalar quantizers, we consider multi-r...


Asymptotic analysis of optimal fixed-rate uniform scalar quantization

This paper studies the asymptotic characteristics of uniform scalar quantizers that are optimal with respect to mean squared error. It is shown that when a symmetric source density with infinite support is sufficiently well behaved, the optimal step size Δ_N for symmetric uniform scalar quantization decreases as (2σ/N)·V⁻¹(1/(6N²)), where N is the number of quantization levels, σ² is the so...

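For context (a sketch of my own, not the paper's method): the quantity whose asymptotics are characterized above can be computed numerically for small N by a direct one-dimensional search over the step size, assuming a unit-variance Gaussian source and nearest-level cells.

```python
# Minimal numerical sketch (not from the paper): find the MSE-optimal step size
# of a symmetric N-level uniform scalar quantizer for a unit-variance Gaussian
# source; this is the quantity whose N -> infinity behaviour the paper studies.
import numpy as np
from scipy.integrate import quad
from scipy.optimize import minimize_scalar
from scipy.stats import norm

def uniform_mse(delta, n_levels):
    """MSE of a uniform quantizer with n_levels levels spaced delta apart,
    nearest-level cells (outer cells unbounded), source ~ N(0, 1)."""
    levels = (np.arange(n_levels) - (n_levels - 1) / 2) * delta
    edges = np.concatenate(([-np.inf], (levels[:-1] + levels[1:]) / 2, [np.inf]))
    return sum(
        quad(lambda x, y=y: (x - y) ** 2 * norm.pdf(x), a, b)[0]
        for y, a, b in zip(levels, edges[:-1], edges[1:])
    )

if __name__ == "__main__":
    for n in (4, 8, 16, 32):
        res = minimize_scalar(uniform_mse, bounds=(1e-3, 4.0), args=(n,), method="bounded")
        print(f"N={n:3d}  optimal step ~ {res.x:.4f}  MSE ~ {res.fun:.5f}")
```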

Embedded Trellis Coded Quantization

Embedded Trellis Coded Quantization (E-TCQ) is introduced as an embedded quantization technique which achieves good rate-distortion performance for reasonable computational complexity. The performance of E-TCQ is investigated for memoryless Gaussian, Laplacian, and uniform sources. E-TCQ is shown to outperform multi-stage TCQ. For Gaussian and Laplacian sources the performance of E-TCQ shows la...


Suboptimality of the Karhunen-Loeve transform for fixed-rate transform coding

An open problem in source coding theory has been whether the Karhunen-Loève transform (KLT) is optimal for a system that orthogonally transforms a vector source, scalar quantizes the components of the transformed vector using optimal bit allocation, and then inverse transforms the vector. Huang and Schultheiss proved in 1963 that for a Gaussian source the KLT is mean squared optimal in ...

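As background for the system described above (a sketch under my own assumptions, not the paper's construction), the pipeline "KLT, then scalar quantization with bit allocation, then inverse transform" can be written compactly. The source dimension, the textbook high-rate bit allocation, the uniform quantizer, and the 4-sigma loading factor are all assumed choices.

```python
# Minimal sketch of the transform-coding pipeline described in the blurb:
# KLT + per-component uniform scalar quantization with bit allocation + inverse transform.
import numpy as np

rng = np.random.default_rng(0)

# Correlated Gaussian vector source: AR(1)-style covariance, dimension 4 (assumed example).
cov = 0.9 ** np.abs(np.subtract.outer(np.arange(4), np.arange(4)))
X = rng.multivariate_normal(np.zeros(4), cov, size=50000)

# KLT: eigenvectors of the sample covariance, sorted by decreasing eigenvalue.
eigvals, T = np.linalg.eigh(np.cov(X, rowvar=False))
order = np.argsort(eigvals)[::-1]
eigvals, T = eigvals[order], T[:, order]

# Standard high-rate bit allocation (assumed, not from the paper): equal average
# rate plus half the log-ratio of each component variance to the geometric mean.
R = 2.0                                                  # average bits per component
bits = np.clip(np.round(R + 0.5 * np.log2(eigvals / np.exp(np.log(eigvals).mean()))), 0, None)

# Uniform scalar quantization of each transform coefficient, then inverse transform.
Y = X @ T
steps = 8 * np.sqrt(eigvals) / 2.0 ** bits               # 4-sigma loading factor, assumed
Xhat = (np.round(Y / steps) * steps) @ T.T

print("bits per component:", bits)
print("reconstruction MSE:", np.mean((X - Xhat) ** 2))
```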




Publication year: 2006